
    The implementation of international geospatial standards for earth and space sciences

    The Earth and Space Sciences Informatics division of the European Geosciences Union (EGU) and the Open Geospatial Consortium jointly organised a special event, 'Implementation of international geospatial standards for earth and space sciences', at the EGU General Assembly held in Vienna in April 2009. The event objectives were: (a) to discuss the integration of information systems from different geosciences disciplines; (b) to promote and discuss the ongoing transition from specific, monolithic systems towards independent and modular enabling infrastructures, which together form an earth system science (ESS) infrastructure; and (c) to present some of the latest advances in implementing open standards. This manuscript introduces the event's motivations and describes an abstract, holistic framework that situates the topics and developments presented by the event speakers. It also introduces important and relatively new approaches for building a multi-disciplinary geosciences information system: the System of Systems approach and the Model Driven Approach. To that end, three important information infrastructure categories are recognised: (a) ESS information infrastructure; (b) geospatial information infrastructure; and (c) distributed information infrastructure. Digital Earth should support the discussed framework to accelerate the transfer of information from theoretical discussions to applications in all fields related to global climate change, natural disaster prevention and response, new energy-source development, agricultural and food security, and urban planning and management.

    Unidata's Common Data Model mapping to the ISO 19123 Data Model

    Access to real-time distributed Earth and Space Science (ESS) information is essential for enabling critical Decision Support Systems (DSS). Data model interoperability between the ESS and DSS communities is therefore a decisive achievement for enabling cyber-infrastructures that aim to serve important societal benefit areas. The ESS community is characterized by considerable heterogeneity as far as data models are concerned. Recent spatial data infrastructures implement international standards for the data model in order to achieve interoperability and extensibility. This paper presents well-accepted ESS data models, introducing a unified data model called the Common Data Model (CDM). The mapping of the CDM into the corresponding elements of the international standard coverage data model of ISO 19123 is presented and discussed at the abstract level. The mapping of CDM scientific data types to the ISO coverage model is a first step toward interoperability of data systems. It provides the abstract framework that can be used to unify subsequent efforts to define appropriate conventions, along with explicit agreed-upon encoding forms, for each data type. As a case in point, the content mapping rules for CDM grid data are discussed through a significant example.
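
    To make the abstract-level mapping described above more concrete, the following minimal Python sketch pairs a simplified CDM-style grid (coordinate axes plus a data array) with a simplified ISO 19123-style discrete grid point coverage (domain set plus range set). The class and field names are illustrative assumptions for this sketch, not the formal mapping defined in the paper.

    # Illustrative sketch: map a simplified CDM-style grid onto an
    # ISO 19123-style discrete grid point coverage (abstract level only).
    from dataclasses import dataclass
    from typing import Dict, List, Tuple

    import numpy as np


    @dataclass
    class CdmGrid:
        # Simplified stand-in for a CDM grid: named coordinate axes plus data values.
        axes: Dict[str, np.ndarray]
        values: np.ndarray
        units: str = ""


    @dataclass
    class DiscreteGridPointCoverage:
        # Simplified stand-in for ISO 19123 CV_DiscreteGridPointCoverage.
        domain: List[Tuple[float, ...]]   # grid point locations (domain set)
        range_values: List[float]         # one value per grid point (range set)
        range_type: Dict[str, str]        # description of the range (e.g. units)


    def cdm_grid_to_coverage(grid: CdmGrid) -> DiscreteGridPointCoverage:
        # The coordinate axes become the domain set; the data array becomes the range set.
        mesh = np.meshgrid(*grid.axes.values(), indexing="ij")
        domain = list(zip(*(m.ravel() for m in mesh)))
        return DiscreteGridPointCoverage(
            domain=domain,
            range_values=grid.values.ravel().tolist(),
            range_type={"units": grid.units},
        )


    # Tiny usage example with a synthetic 2x3 temperature grid.
    grid = CdmGrid(
        axes={"lat": np.array([10.0, 20.0]), "lon": np.array([0.0, 5.0, 10.0])},
        values=np.arange(6, dtype=float).reshape(2, 3),
        units="K",
    )
    coverage = cdm_grid_to_coverage(grid)
    print(len(coverage.domain), coverage.range_values[:3])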

    Integrating CERIF Entities in a Multidisciplinary e-infrastructure for Environmental Research Data

    The paper proposes different solutions for integrating CERIF in the environmental dataset domain, based on the quality of semantic mapping as well as on the characteristics of the CERIF data model. A two-way crosswalk is described, resulting in the identification of a core of corresponding metadata and a proposal for extensions of the CERIF model. Extensions of ISO concepts are also described to provide contextual research information in the domain of environmental research data. Finally, the crosswalk has been implemented in the GI-cat discovery broker framework. Successful tests demonstrated the possibility for CERIF information to be integrated in ISO-compliant infrastructures and for INSPIRE information to be captured in CERIF.
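
    As an illustration of what such a two-way crosswalk can look like in practice, the short Python sketch below maps a handful of CERIF-like entity/attribute names onto ISO 19115-like discovery elements and derives the reverse table from the same dictionary. The element pairs shown are simplified assumptions for this sketch, not the mapping actually defined in the paper.

    # Illustrative two-way crosswalk between CERIF-like and ISO 19115-like elements.
    CERIF_TO_ISO = {
        "cfResultProduct.cfName": "MD_Metadata.identificationInfo.citation.title",
        "cfResultProduct.cfDescription": "MD_Metadata.identificationInfo.abstract",
        "cfPerson": "MD_Metadata.contact.CI_ResponsibleParty",
        "cfProject": "MD_Metadata.identificationInfo.aggregationInfo",
    }

    # Deriving the reverse direction from the same table keeps both crosswalks consistent.
    ISO_TO_CERIF = {iso: cerif for cerif, iso in CERIF_TO_ISO.items()}


    def crosswalk(record: dict, mapping: dict) -> dict:
        # Translate a flat metadata record from one schema's keys to the other's.
        return {mapping[key]: value for key, value in record.items() if key in mapping}


    cerif_record = {"cfResultProduct.cfName": "River discharge time series"}
    print(crosswalk(cerif_record, CERIF_TO_ISO))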

    Exploring the depths of the global earth observation system of systems

    Big Earth Data-Cube infrastructures are becoming more and more popular for providing Analysis Ready Data, especially for managing satellite time series. These infrastructures build on the concept of a multidimensional data model (data hypercube) and are complex systems engaging different disciplines and expertise. For this reason, their interoperability capacity has become a challenge in the Global Change and Earth System science domains. To address this challenge, there is a pressing need in the community to reach a widely agreed definition of Data-Cube infrastructures and their key features. In this respect, a discussion has recently started about the definition of the possible facets characterizing a Data-Cube in the Earth Observation domain. This manuscript contributes to that debate by introducing a view-based model of Earth Data-Cube systems to design their infrastructural architecture and content schemas, with the final goal of enabling and facilitating interoperability. It introduces six modeling views, each described according to its main concerns, principal stakeholders, and possible patterns to be used. The manuscript considers the Business Intelligence experience with Data Warehouses and multidimensional "cubes", along with the more recent and analogous developments in the Earth Observation domain, and puts forward a set of interoperability recommendations based on the modeling views.
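
    A common way to make the data hypercube concept concrete is a labelled multidimensional array. The minimal Python sketch below builds a tiny time/lat/lon cube with the xarray library and shows two typical cube operations (spatial subsetting and temporal aggregation); the variable name, coordinate values, and metadata attributes are invented for the example and do not come from the manuscript.

    import numpy as np
    import pandas as pd
    import xarray as xr

    # Build a tiny time/lat/lon hypercube of synthetic reflectance values.
    time = pd.date_range("2020-01-01", periods=4)
    lat = np.linspace(45.0, 46.0, 3)
    lon = np.linspace(7.0, 8.0, 3)
    cube = xr.DataArray(
        np.random.rand(4, 3, 3),
        coords={"time": time, "lat": lat, "lon": lon},
        dims=("time", "lat", "lon"),
        name="reflectance",
        attrs={"units": "1", "crs": "EPSG:4326"},  # content-schema metadata
    )

    # Typical cube operations: slice a spatial subset and aggregate along time.
    subset = cube.sel(lat=slice(45.0, 45.5))
    temporal_mean = cube.mean(dim="time")
    print(subset.shape, temporal_mean.shape)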

    The uncertainty enabled model web (UncertWeb)

    UncertWeb is a European research project, running from 2010 to 2013, that will realize the uncertainty-enabled model web. The assumption is that data services, in order to be useful, need to provide information about the accuracy or uncertainty of the data in a machine-readable form. Models taking these data as input should understand this and propagate errors through model computations, and quantify and communicate the errors or uncertainties generated by the model approximations. The project will develop technology to realize this and provide demonstration case studies.
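
    The core idea, propagating input uncertainty through a model rather than passing single values, can be sketched with a simple Monte Carlo example in Python. The toy model, distribution parameters, and sample size below are assumptions made for this illustration; they are not UncertWeb components, nor its actual encodings or service chain.

    import numpy as np

    rng = np.random.default_rng(42)


    def toy_model(rainfall_mm: np.ndarray) -> np.ndarray:
        # Purely illustrative model: estimate runoff from rainfall.
        return 0.6 * rainfall_mm - 2.0


    # The input is described as a distribution (mean and standard deviation),
    # mimicking a data service that reports uncertainty in machine-readable form.
    rain_mean, rain_sd = 50.0, 5.0

    # Monte Carlo propagation: sample the input, run the model on each sample,
    # and summarise the resulting output distribution.
    samples = rng.normal(rain_mean, rain_sd, size=10_000)
    runoff = toy_model(samples)
    print(f"runoff mean = {runoff.mean():.2f}, sd = {runoff.std():.2f}")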

    Destination Earth: Use Cases Analysis

    Destination Earth (DestinE) is an initiative initiated and coordinated by the European Commission's Directorate-General for Communications Networks, Content and Technology (DG CNECT) in support of the European Green Deal and as a contribution to the establishment of the Green Deal Data Space, one of several data spaces envisaged in the European Strategy for Data (COM(2020) 66 final). The overall objective of DestinE is to develop a service infrastructure that serves specific EU needs based on clearly identified EU policy priorities and user needs, for example in relation to the Green Deal, and that is at the same time firmly based on European values, such as a commitment to quality and transparency, in order to build trust in evidence-based policy-making among all stakeholders. DestinE will include a shared horizontal layer comprising computer processing, data, software, and infrastructure, and a set of vertical applications, Digital Twins (DTs), in selected thematic areas responding to priority policy use cases. To identify these priorities, the Joint Research Centre of the European Commission (JRC) has been tasked by DG CNECT to collect potential use cases for DestinE representing the needs of policy DGs in the Commission. This document presents 30 use cases received from six policy DGs, the JRC, and five relevant European stakeholders. The use cases were preliminarily evaluated and clustered according to their assumed maturity level in terms of policy, scientific, and anticipatory potential. Following a series of interactions with all stakeholders consulted, the JRC identified two initial DTs: one on Extreme Earth issues (disaster risk management in relation to extreme weather-induced natural disasters) and one on climate change adaptation issues (primarily food and water supply security). Another DT on digital oceans (around food and energy issues) was suggested to DG CNECT for possible development in the second phase of DestinE's implementation. Although this report is neither exhaustive nor fully descriptive, for example in relation to the assumed maturity levels of the mentioned use cases, it nevertheless served as a useful input to DG CNECT's final definition of the scope of the DTs whose development would be prioritized in the course of the DestinE implementation.

    D4.2. Observation inventory description and results report

    The ConnectinGEO Observation Inventory (OI) is created and populated using the current information in the metadata concentrated in the GEO Discovery and Access Broker (DAB) of the GEOSS Common Infrastructure (GCI), in order to analyse the observations and measurements currently available in it. WP4 defined a high-level process for the population of the Observation Inventory: (i) retrieve the full metadata content for each record in the GEO DAB, (ii) extract/infer extra semantics (connecting to external knowledge systems when needed), and (iii) generate enriched metadata and write it to the OI. The OI system architecture was designed and developed, and the first version of the OI was populated from the GEO DAB. The first population process was run in December 2015, resulting in a total of more than 1.6 million harvested metadata records. The developed OI is accessible online and can be used as a data source by different analysis tools, which create plots, reports, or summary statistics useful for the ConnectinGEO gap analysis. A simple web client was developed to demonstrate how to interrogate the OI and to provide basic examples of how the developed OI can be used by web-based analysis tools.
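
    The three-step population process described above can be pictured as a small harvest/enrich/write pipeline. The Python sketch below mirrors those steps with placeholder records and a crude keyword rule; it does not use the actual GEO DAB API, the real OI schema, or the project's enrichment logic.

    from typing import Dict, Iterable, List


    def harvest_records() -> Iterable[Dict]:
        # Step (i): stand-in for retrieving metadata records from the broker.
        yield {"id": "rec-1", "title": "Sea surface temperature, daily, global"}
        yield {"id": "rec-2", "title": "Atmospheric CO2 concentration, Mauna Loa"}


    def enrich(record: Dict) -> Dict:
        # Step (ii): infer extra semantics, here a crude keyword-based variable tag.
        title = record["title"].lower()
        variables = [v for v in ("temperature", "co2") if v in title]
        return {**record, "inferred_variables": variables}


    def write_to_inventory(records: List[Dict]) -> None:
        # Step (iii): stand-in for persisting enriched records to the OI store.
        for rec in records:
            print("stored:", rec["id"], rec["inferred_variables"])


    write_to_inventory([enrich(r) for r in harvest_records()])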

    Geo-processing in cyberinfrastructure: making the web an easy to use geospatial computational platform

    Access to data on the web has become routine, based upon open standards from the IETF and W3C. Access to explicitly geospatial data is routinely done using data access standards from the OGC. Geoprocessing services on the web are now being developed: data must be processed in order to apply or fuse it for specific applications. Standards and implementations for processing data on the web are just now becoming established. For geospatial data, the OGC has defined the Web Processing Service (WPS) interface standard. Now is a critical time to bring convergence to WPS profiles that make the web an easy-to-use geospatial computational service. Access to network-accessible processing services is bringing geoprocessing to the cyberinfrastructure.
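
    For readers unfamiliar with the WPS interface, the minimal Python sketch below issues the two discovery operations defined by WPS 1.0.0 (GetCapabilities and DescribeProcess) as key-value-pair HTTP requests using only the standard library. The endpoint URL and process identifier are placeholders to be replaced with a real service and process.

    import urllib.parse
    import urllib.request

    # Placeholder endpoint; substitute the URL of an actual OGC WPS service.
    WPS_ENDPOINT = "https://example.org/wps"


    def wps_request(operation: str, **extra: str) -> str:
        # Build and issue a WPS 1.0.0 key-value-pair request; return the XML response.
        params = {"service": "WPS", "version": "1.0.0", "request": operation, **extra}
        url = f"{WPS_ENDPOINT}?{urllib.parse.urlencode(params)}"
        with urllib.request.urlopen(url) as response:
            return response.read().decode("utf-8")


    # Discover what the service offers, then ask how a given process is invoked.
    capabilities_xml = wps_request("GetCapabilities")
    process_xml = wps_request("DescribeProcess", identifier="example:BufferProcess")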